Minimizing a sum of clipped convex functions
Authors
Abstract
Similar resources
Minimizing a sum of submodular functions
We consider the problem of minimizing a function represented as a sum of submodular terms. We assume each term allows an efficient computation of exchange capacities. This holds, for example, for terms depending on a small number of variables, or for certain cardinality-dependent terms. A naive application of submodular minimization algorithms would not exploit the existence of specialized excha...
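As a small illustration of the kind of objective this abstract describes, the sketch below brute-forces the minimum of a sum of submodular terms over a five-element ground set: a cut term that depends on a few variables, a concave-of-cardinality term, and a modular reward. The graph, weights, and names are invented for the example; this is not the paper's specialized exchange-capacity algorithm.

from itertools import combinations
import math

GROUND = range(5)                      # small ground set {0, ..., 4}
WEIGHTS = [0.9, 0.2, 0.9, 0.5, 0.5]    # element rewards (modular term)
EDGES = [(0, 1), (1, 2), (3, 4)]       # tiny graph for the cut term

def cut_term(S):
    # graph-cut term: submodular, each edge depends on only two variables
    return sum(1.0 for u, v in EDGES if (u in S) != (v in S))

def cardinality_term(S):
    # concave function of |S|, hence submodular
    return math.sqrt(len(S))

def total(S):
    # sum of submodular terms minus a modular reward (still submodular)
    return cut_term(S) + cardinality_term(S) - sum(WEIGHTS[i] for i in S)

subsets = (frozenset(c) for r in range(len(GROUND) + 1)
           for c in combinations(GROUND, r))
best = min(subsets, key=total)
print(sorted(best), total(best))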
Minimizing the sum of many rational functions
We consider the problem of globally minimizing the sum of many rational functions over a given compact semialgebraic set. The number of terms can be large (10 to 100), the degree of each term should be small (up to 10), and the number of variables can be large (10 to 100) provided some kind of sparsity is present. We describe a formulation of the rational optimization problem as a generalized m...
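For intuition only, here is a toy in which a few rational terms p_i/q_i are summed and minimized over a box with a generic local solver (SciPy's L-BFGS-B). This is not the paper's global approach via the generalized moment problem, and all problem data below are invented.

import numpy as np
from scipy.optimize import minimize

def term(x, a, b):
    # one rational term p/q, with q > 0 on the box; a, b are illustrative data
    p = (x[0] - a) ** 2 + x[1] * b
    q = 1.0 + x[0] ** 2 + x[1] ** 2
    return p / q

def objective(x):
    data = [(1.0, 0.5), (-0.5, 1.0), (0.3, -0.2)]  # a few rational terms
    return sum(term(x, a, b) for a, b in data)

res = minimize(objective, x0=np.zeros(2), bounds=[(-1, 1), (-1, 1)],
               method="L-BFGS-B")
print(res.x, res.fun)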
Fast Alternating Linearization Methods for Minimizing the Sum of Two Convex Functions
We present in this paper first-order alternating linearization algorithms based on an alternating direction augmented Lagrangian approach for minimizing the sum of two convex functions. Our basic methods require at most O(1/ε) iterations to obtain an ε-optimal solution, while our accelerated (i.e., fast) versions of them require at most O(1/√ε) iterations, with little change in the ...
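The snippet below is a generic sketch of the alternating direction augmented Lagrangian (ADMM) idea for minimizing f(x) + g(z) subject to x = z, instantiated on a lasso problem with synthetic data. It illustrates the splitting structure only, not the paper's specific alternating linearization algorithms or their iteration bounds.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
lam, rho = 0.1, 1.0            # regularization and penalty parameters (assumed)

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = z = u = np.zeros(20)
M = np.linalg.inv(A.T @ A + rho * np.eye(20))   # cached factor for the x-update
for _ in range(200):
    x = M @ (A.T @ b + rho * (z - u))     # minimize f(x) + (rho/2)||x - z + u||^2
    z = soft_threshold(x + u, lam / rho)  # prox step on g
    u = u + x - z                         # scaled dual update
print(np.round(z, 3))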
On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
In this paper we present a variant of the proximal forward-backward splitting method for solving nonsmooth optimization problems in Hilbert spaces, when the objective function is the sum of two nondifferentiable convex functions. The proposed iteration, which will be called the Proximal Subgradient Splitting Method, extends the classical projected subgradient iteration for important classes of pr...
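As a rough illustration of this kind of scheme, the sketch below runs a proximal subgradient splitting iteration on an assumed toy problem: f(x) = ||Ax - b||_1 is handled by a subgradient step and g(x) = lam * ||x||_1 by its proximal operator, with a diminishing step size. The data and step-size rule are placeholders, not the paper's exact variant.

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
lam = 0.05

def prox_l1(v, t):
    # proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(10)
for k in range(500):
    v = A.T @ np.sign(A @ x - b)        # a subgradient of f at x
    t = 0.01 / np.sqrt(k + 1)           # diminishing step size
    x = prox_l1(x - t * v, t * lam)     # forward subgradient step, backward prox step
print(np.round(x, 3))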
Distributed Optimization of Convex Sum of Non-Convex Functions
We present a distributed solution to optimizing a convex function composed of several non-convex functions. Each non-convex function is privately stored with an agent, while the agents communicate with neighbors to form a network. We show that the coupled consensus and projected gradient descent algorithm proposed in [1] can optimize a convex sum of non-convex functions under an additional assumption o...
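To make the setup concrete, here is a toy consensus-plus-projected-gradient run on a ring of five agents. Each local function f_i(x) = (x - a_i)^2 + c_i * x^3 is non-convex, but the c_i sum to zero, so the total is convex. The network, weights, step sizes, and box constraint are illustrative assumptions, not the construction in [1].

import numpy as np

a = np.array([1.0, -0.5, 0.3, 0.8, -1.2])
c = np.array([0.4, -0.1, -0.3, 0.2, -0.2])   # sums to zero, so sum of f_i is convex
n = len(a)

def grad_local(i, x):
    # gradient of f_i(x) = (x - a_i)^2 + c_i * x^3
    return 2.0 * (x - a[i]) + 3.0 * c[i] * x ** 2

def project(x, lo=-2.0, hi=2.0):
    # projection onto the box constraint [-2, 2]
    return np.clip(x, lo, hi)

x = np.zeros(n)                  # one scalar estimate per agent
for k in range(2000):
    # consensus step: average with ring neighbors (doubly stochastic weights)
    mixed = (np.roll(x, 1) + x + np.roll(x, -1)) / 3.0
    step = 0.5 / (k + 1)         # diminishing step size
    grads = np.array([grad_local(i, mixed[i]) for i in range(n)])
    x = project(mixed - step * grads)
print(np.round(x, 3))            # agents should roughly agree near the minimizer of the sum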
Journal
Journal title: Optimization Letters
Year: 2020
ISSN: 1862-4472, 1862-4480
DOI: 10.1007/s11590-020-01565-4